Sixty years ago, at the Institute for Advanced Study in Princeton, New Jersey, a 32 × 32 × 40-bit matrix of 24-microsecond random-access memory was undergoing initial tests. John von Neumann (1903–1957) succeeded in jump-starting the digital revolution by bringing engineers into the den of the mathematicians, rather than by bringing mathematicians into a den of engineers. This implementation of Alan Turing’s Universal Machine broke the distinction between numbers that mean things and numbers that do things, and the world would never be the same. With 5 kilobytes of storage, von Neumann and colleagues tackled previously intractable problems ranging from thermonuclear explosions, stellar evolution, and long-range weather forecasting to cellular automata, network optimization, and the origins of life. Codes were small enough to be completely debugged, but hardware could not be counted on to perform consistently from one kilocycle to the next. This situation is now reversed.